Why are DBNs sparse?
Abstract
Real stochastic processes operating in continuous time can be modeled by sets of stochastic differential equations. On the other hand, several popular model families, including hidden Markov models and dynamic Bayesian networks (DBNs), use discrete time steps. This paper explores methods for converting DBNs with infinitesimal time steps into DBNs with finite time steps, to enable efficient simulation and filtering over long periods. An exact conversion (summing out all intervening time slices between two steps) results in a completely connected DBN, yet nearly all human-constructed DBNs are sparse. We show how this sparsity arises from well-founded approximations resulting from differences among the natural time scales of the variables in the DBN. We define an automated procedure for constructing an approximate DBN model for any desired time step and prove error bounds for the approximation. We illustrate the method by generating a series of approximations to a simple pH model for the human body, demonstrating speedups of several orders of magnitude compared to the original model.
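The densification claim can be made concrete with a toy continuous-time model. The sketch below is illustrative only, not the paper's code: the rate matrix Q, the step size dt, and the tolerance tol are invented for this example, and the matrix exponential stands in for summing out all intervening infinitesimal time slices. The exact finite-step transition matrix is dense, while thresholding its small entries (small precisely because of the gap between fast and slow time scales) restores sparsity at a bounded per-entry cost.

```python
# Illustrative sketch: a 4-state continuous-time Markov chain with a
# fast pair of states (rates ~10) and a slow pair (rates ~0.01).
# The exact transition matrix over a finite step -- the analogue of
# summing out all intervening time slices -- is dense; zeroing its
# tiny entries recovers a sparse approximation.
import numpy as np
from scipy.linalg import expm

# Hypothetical rate matrix (rows sum to zero); the fast and slow
# blocks differ by roughly three orders of magnitude.
Q = np.array([
    [-10.00,  10.00,  0.00,  0.00],
    [ 10.00, -10.01,  0.01,  0.00],
    [  0.00,   0.01, -0.02,  0.01],
    [  0.00,   0.00,  0.01, -0.01],
])

def finite_step_transition(Q, dt):
    """Exact transition matrix over a step of length dt (matrix exponential)."""
    return expm(Q * dt)

def sparsify(P, tol):
    """Zero entries below tol and renormalize rows; the dropped
    probability mass bounds the per-step approximation error."""
    A = np.where(P < tol, 0.0, P)
    return A / A.sum(axis=1, keepdims=True)

P = finite_step_transition(Q, dt=0.1)
P_hat = sparsify(P, tol=1e-4)
print("nonzero entries, exact:      ", int(np.count_nonzero(P > 1e-12)))
print("nonzero entries, approximate:", int(np.count_nonzero(P_hat)))
print("max absolute entry error:    ", float(np.abs(P - P_hat).max()))
```

In the paper's terms, the tolerance plays the role of the time-scale-based approximation: transitions that are negligible at the chosen step width are dropped, and the error bound follows from the discarded probability mass.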
Similar resources
Using deep belief networks for vector-based speaker recognition
Deep belief networks (DBNs) have become a successful approach for acoustic modeling in speech recognition. DBNs exhibit strong approximation properties, improved performance, and parameter efficiency. In this work, we propose methods for applying DBNs to speaker recognition. In contrast to prior work, our approach to DBNs for speaker recognition starts at the acoustic modeling layer. We use ...
Extended Double-Base Number System with Applications to Elliptic Curve Cryptography
We investigate the impact of larger digit sets on the length of Double-Base Number System (DBNS) expansions. We present a new representation system called extended DBNS whose expansions can be extremely sparse. When compared with double-base chains, the average length of extended DBNS expansions of integers of size in the range 200–500 bits is reduced by approximately 20% using one precomputed...
A sparse-response deep belief network based on rate distortion theory
Deep belief networks (DBNs) are currently the dominant technique for modeling the architectural depth of the brain, and can be trained efficiently in a greedy, layer-wise, unsupervised manner. However, DBNs without a narrow hidden bottleneck typically produce redundant, continuous-valued codes and unstructured weight patterns. Taking inspiration from rate distortion (RD) theory, which encode...
A hybrid DBNS processor for DSP computation
This paper introduces a modification to an index calculus representation for the Double-Base Number System (DBNS). The DBNS uses the bases 2 and 3; it is redundant (very sparse) and has a simple two-dimensional representation. An extremely sparse form of the DBNS uses a single non-zero digit to represent any real number with arbitrary precision. In this case the single digit can be identified b...
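To make the sparsity of such representations concrete, here is a small illustrative sketch (not code from the cited paper; best_term and dbns_expand are hypothetical helper names): a greedy double-base expansion repeatedly subtracts the largest number of the form 2^a * 3^b, so most integers need only a handful of terms.

```python
# Illustrative greedy double-base expansion: repeatedly subtract the
# largest term of the form 2**a * 3**b, yielding a very sparse
# representation of a positive integer n.

def best_term(n):
    """Largest value 2**a * 3**b <= n, returned as (value, a, b)."""
    best = (1, 0, 0)
    b, p3 = 0, 1
    while p3 <= n:
        a = (n // p3).bit_length() - 1      # floor(log2(n // p3))
        val = (1 << a) * p3
        if val > best[0]:
            best = (val, a, b)
        b += 1
        p3 *= 3
    return best

def dbns_expand(n):
    """Greedy DBNS expansion of n as a list of (a, b) exponent pairs."""
    terms = []
    while n > 0:
        val, a, b = best_term(n)
        terms.append((a, b))
        n -= val
    return terms

print(dbns_expand(100))   # [(5, 1), (2, 0)]  ->  100 = 2**5 * 3 + 2**2
```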
Deep networks for robust visual recognition
Deep Belief Networks (DBNs) are hierarchical generative models which have been used successfully to model high dimensional visual data. However, they are not robust to common variations such as occlusion and random noise. We explore two strategies for improving the robustness of DBNs. First, we show that a DBN with sparse connections in the first layer is more robust to variations that are not ...
Publication date: 2010